CLIP: Cheap Lipschitz Training of Neural Networks

Authors

Abstract

Despite the large success of deep neural networks (DNNs) in recent years, most neural networks still lack mathematical guarantees in terms of stability. For instance, DNNs are vulnerable to small or even imperceptible input perturbations, so-called adversarial examples, that can cause false predictions. This instability can have severe consequences in applications which influence the health and safety of humans, e.g., biomedical imaging or autonomous driving. While bounding the Lipschitz constant of a neural network improves stability, most methods rely on restricting the Lipschitz constants of each layer, which gives a poor bound for the actual Lipschitz constant of the network. In this paper we investigate a variational regularization method named CLIP for controlling the Lipschitz constant of a neural network, which can easily be integrated into the training procedure. We mathematically analyze the proposed model, in particular discussing the impact of the chosen regularization parameter on the output of the network. Finally, we numerically evaluate our method on both a nonlinear regression problem and the MNIST and Fashion-MNIST classification databases, and compare our results with a weight regularization approach.
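
The abstract describes the scheme only at a high level, so the following is a minimal PyTorch sketch of the idea, assuming the Lipschitz constant is estimated by difference quotients on a set of input pairs that are updated by gradient ascent during training; every name here (lip_estimate, train_step, the step sizes) is illustrative and not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def lip_estimate(model, u, v, eps=1e-9):
        # Difference-quotient estimate of the Lipschitz constant on pairs (u, v).
        num = (model(u) - model(v)).flatten(1).norm(dim=1)
        den = (u - v).flatten(1).norm(dim=1) + eps
        return (num / den).max()

    def train_step(model, opt, x, y, u, v, lam=0.1, ascent_lr=1e-2):
        # 1) Move the probe pair (u, v) toward a larger difference quotient
        #    (gradient ascent), so the estimate tracks the worst case.
        u, v = u.clone().requires_grad_(True), v.clone().requires_grad_(True)
        gu, gv = torch.autograd.grad(lip_estimate(model, u, v), (u, v))
        u, v = (u + ascent_lr * gu).detach(), (v + ascent_lr * gv).detach()
        # 2) Descend on the task loss plus the Lipschitz penalty.
        loss = F.cross_entropy(model(x), y) + lam * lip_estimate(model, u, v)
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item(), u, v

Shrinking lam toward zero recovers ordinary training, while a large lam forces a small Lipschitz constant at the price of accuracy; this trade-off in the regularization parameter is what the abstract says the paper analyzes.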


Similar resources

Lipschitz-Margin Training: Scalable Certification of Perturbation Invariance for Deep Neural Networks

High sensitivity of neural networks against malicious perturbations on inputs causes security concerns. We aim to ensure perturbation invariance in their predictions. However, prior work requires strong assumptions on network structures and massive computational costs, and thus their applications are limited. In this paper, based on Lipschitz constants and prediction margins, we present a widel...
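
The abstract is truncated, but the certification test it alludes to can be sketched: if the logit map has a global Lipschitz bound L in the l2 norm, then perturbations with norm below margin / (sqrt(2) * L) provably cannot change the predicted class. The sqrt(2) factor and the function name below are assumptions in this hedged Python sketch.

    import math
    import torch

    def certified_radius(logits, lipschitz_bound):
        # Per-example l2 radius within which the prediction is guaranteed invariant.
        top2 = logits.topk(2, dim=1).values   # best and runner-up logits
        margin = top2[:, 0] - top2[:, 1]      # prediction margin
        return margin / (math.sqrt(2) * lipschitz_bound)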


Training Large Neural Networks

We describe regularization tools for training large-scale artificial feed-forward neural networks. We propose algorithms that explicitly use a sequence of Tikhonov regularized nonlinear least squares problems. For large-scale problems, methods using new special purpose automatic differentiation are used in a conjugate gradient method for computing a truncated Gauss-Newton search direction. The al...
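
As a reading aid, here is a minimal NumPy sketch of the building block this abstract names: one Tikhonov-regularized Gauss-Newton step, with the normal equations (J^T J + gamma I) d = -J^T r solved by truncated conjugate gradients. The explicit Jacobian J of the residual r and the parameter gamma are assumed given, a simplification of the matrix-free automatic differentiation the abstract mentions; all names are illustrative.

    import numpy as np

    def gauss_newton_step(J, r, gamma, cg_iters=50, tol=1e-8):
        # Truncated-CG solve of the regularized normal equations.
        A = lambda d: J.T @ (J @ d) + gamma * d   # matrix-free normal operator
        b = -J.T @ r
        d = np.zeros_like(b)
        res = b - A(d)
        p = res.copy()
        for _ in range(cg_iters):
            Ap = A(p)
            alpha = (res @ res) / (p @ Ap)
            d += alpha * p
            new_res = res - alpha * Ap
            beta = (new_res @ new_res) / (res @ res)
            res, p = new_res, new_res + beta * p
            if np.linalg.norm(res) < tol:
                break
        return d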


neuralnet: Training of Neural Networks

Artificial neural networks are applied in many situations. neuralnet is built to train multi-layer perceptrons in the context of regression analyses, i.e. to approximate functional relationships between covariates and response variables. Thus, neural networks are used as extensions of generalized linear models. neuralnet is a very flexible package. The backpropagation algorithm and three versio...


Rodbar Dam Slope Stability Analysis Using Neural Networks

In this research, an artificial neural network is presented for predicting the safety factor values and the critical safety factor of inhomogeneous earth dams, taking the effect of earthquake inertia forces into account. The model inputs comprise the dam height, the upstream slope angle, the earthquake coefficient, the water height, and the strength parameters of the core and shell; its output is the safety factor. The most important quantity in slope stability analysis is the safety factor. In this research ...

Sobolev Training for Neural Networks

At the heart of deep learning we aim to use neural networks as function approximators – training them to produce outputs from inputs in emulation of a ground truth function or data creation process. In many cases we only have access to input-output pairs from the ground truth, however it is becoming more common to have access to derivatives of the target output with respect to the input – for e...
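
The truncated abstract still makes the core idea clear: fit the target derivatives as well as the target values. Below is a minimal PyTorch sketch of such a Sobolev-style loss for a scalar-output regression model; the quadratic penalties and the weight alpha are assumptions for illustration, not the paper's exact formulation.

    import torch

    def sobolev_loss(model, x, y, dy_dx, alpha=1.0):
        # Penalize the output error plus the mismatch of input gradients.
        # Assumes model(x) has one scalar output per example, so the gradient
        # of the summed outputs yields per-example input gradients.
        x = x.clone().requires_grad_(True)
        out = model(x)
        grad, = torch.autograd.grad(out.sum(), x, create_graph=True)
        return ((out - y) ** 2).mean() + alpha * ((grad - dy_dx) ** 2).mean()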



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-75549-2_25